Tags: multiple outputs, lecture-16, categorical cross-entropy
A multi-class classifier has 4 output nodes with softmax activation. The true label is \(\vec y = (0, 0, 1, 0)\) and the softmax outputs are \(\vec h = (0.1, 0.2, 0.6, 0.1)\).
Compute the categorical cross-entropy loss. Leave your answer in terms of \(\log\).
\(-\log(0.6)\).
By the categorical cross-entropy formula \(L = -\sum_j y_j \log(h_j)\):
Only \(y_3 = 1\) contributes, so the loss is \(-\log(h_3) = -\log(0.6)\).
Tags: multiple outputs, lecture-16, categorical cross-entropy
A multi-class classifier has 3 output nodes with softmax activation. The true label is \(\vec y = (0, 1, 0)\) and the softmax outputs are \(\vec h = (0.3, 0.5, 0.2)\).
Compute the categorical cross-entropy loss. Leave your answer in terms of \(\log\).
\(-\log(0.5) = \log 2\).
By the categorical cross-entropy formula \(L = -\sum_j y_j \log(h_j)\):
Only \(y_2 = 1\) contributes, so the loss is \(-\log(h_2) = -\log(0.5) = \log 2\).
Tags: multiple outputs, lecture-16, categorical cross-entropy
A multi-class classifier has 4 output nodes with softmax activation. The true label is \(\vec y = (1, 0, 0, 0)\) and the softmax outputs are \(\vec h = (0.4, 0.3, 0.2, 0.1)\).
Compute the categorical cross-entropy loss. Leave your answer in terms of \(\log\).
\(-\log(0.4)\).
By the categorical cross-entropy formula \(L = -\sum_j y_j \log(h_j)\):
Only \(y_1 = 1\) contributes, so the loss is \(-\log(h_1) = -\log(0.4)\).
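As a quick numerical check, all three losses above follow directly from the definition \(L = -\sum_j y_j \log(h_j)\). Below is a minimal sketch using only Python's standard library; the function name is illustrative, not from the lecture.

```python
import math

def categorical_cross_entropy(y, h):
    # L = -sum_j y_j * log(h_j); with a one-hot label, only the
    # true-class term is nonzero, so L = -log(h_true).
    return -sum(yj * math.log(hj) for yj, hj in zip(y, h))

# The three cards above:
print(categorical_cross_entropy((0, 0, 1, 0), (0.1, 0.2, 0.6, 0.1)))  # -log(0.6) ≈ 0.5108
print(categorical_cross_entropy((0, 1, 0), (0.3, 0.5, 0.2)))          # -log(0.5) = log 2 ≈ 0.6931
print(categorical_cross_entropy((1, 0, 0, 0), (0.4, 0.3, 0.2, 0.1)))  # -log(0.4) ≈ 0.9163
```

Note that a smaller probability on the true class yields a larger loss, which is why the third card's \(-\log(0.4)\) exceeds the first card's \(-\log(0.6)\).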